Many problems in computational neuroscience, neuroinformatics, pattern/image recognition, signal processing and machine learning generate massive amounts of multidimensional data with multiple aspects and high dimensionality. Tensors (i.e., multi-way arrays) often provide a natural and compact representation for such massive multidimensional data via suitable low-rank approximations. Big data analytics require novel technologies to efficiently process huge datasets within tolerable elapsed times. One such emerging technology for multidimensional big data is multiway analysis via tensor networks (TNs) and tensor decompositions (TDs), which represent tensors by sets of factor (component) matrices and lower-order (core) tensors. Dynamic tensor analysis allows us to discover meaningful hidden structures in complex data and to perform generalizations by capturing multi-linear and multi-aspect relationships. We will discuss some fundamental TN models, their mathematical and graphical descriptions, and associated learning algorithms for large-scale TDs and TNs, with many potential applications including: anomaly detection, feature extraction, classification, cluster analysis, data fusion and integration, pattern recognition, predictive modeling, regression, time series analysis and multiway component analysis.

Keywords: Large-scale HOSVD, Tensor decompositions, CPD, Tucker models, Hierarchical Tucker (HT) decomposition, low-rank tensor approximations (LRA), Tensorization/Quantization, tensor train (TT/QTT), Matrix Product States (MPS), Matrix Product Operator (MPO), DMRG, Strong Kronecker Product (SKP).
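As a minimal illustration of the idea that TDs represent a tensor compactly by a set of factor matrices, the sketch below (an assumed NumPy example, not code from the paper; all variable names are hypothetical) builds a 3rd-order tensor of CP rank R from three factor matrices and compares storage costs:

```python
import numpy as np

# Hypothetical example: a 3-way tensor of CP rank R is represented by
# three factor matrices A, B, C, one per tensor mode.
rng = np.random.default_rng(0)
I, J, K, R = 20, 30, 40, 3

A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))

# Reconstruct the full tensor as a sum of R rank-1 (outer-product) terms:
#   X[i, j, k] = sum_r A[i, r] * B[j, r] * C[k, r]
X = np.einsum('ir,jr,kr->ijk', A, B, C)

full_storage = I * J * K            # 24000 entries for the dense tensor
factored_storage = R * (I + J + K)  # 270 entries for the CP factors
print(X.shape, full_storage, factored_storage)
```

The storage comparison shows why such low-rank representations scale to big data: the dense tensor grows multiplicatively with the mode sizes, while the CP factors grow only additively.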